Online Douglas-Rachford splitting method

Author

  • Ziqiang Shi
Abstract

Online and stochastic learning have emerged as powerful tools in large-scale optimization. In this work, we generalize the Douglas-Rachford splitting (DRs) method for minimizing composite functions to the online and stochastic settings (to the best of our knowledge, this is the first time DRs has been generalized to a sequential version). We first establish an O(1/√T) regret bound for the batch DRs method. We then prove that the online DRs method enjoys an O(1) regret bound and that the stochastic DRs method has a convergence rate of O(1/√T). The proofs are simple and intuitive, and the results and techniques can serve as a starting point for research on large-scale machine learning employing the DRs method. Numerical experiments demonstrate the effectiveness of the online and stochastic update rules and further confirm our regret and convergence analysis.
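
For readers unfamiliar with DRs, the sketch below illustrates the classical batch update that the paper generalizes, applied to a lasso-type composite objective. This is only an illustration under our own choice of problem and names (prox_ls, prox_l1, drs_lasso are not from the paper); it shows the standard batch iteration, not the online or stochastic rule proposed by the authors.

import numpy as np

def prox_ls(v, A, b, gamma):
    """Proximal map of f(x) = 0.5*||Ax - b||^2 (reduces to a linear solve)."""
    n = A.shape[1]
    return np.linalg.solve(A.T @ A + np.eye(n) / gamma, A.T @ b + v / gamma)

def prox_l1(v, thresh):
    """Proximal map of lam*||x||_1, i.e. soft-thresholding with thresh = gamma*lam."""
    return np.sign(v) * np.maximum(np.abs(v) - thresh, 0.0)

def drs_lasso(A, b, lam, gamma=1.0, iters=200):
    """Classical batch DRs iteration for min_x 0.5*||Ax - b||^2 + lam*||x||_1."""
    z = np.zeros(A.shape[1])
    for _ in range(iters):
        x = prox_ls(z, A, b, gamma)           # x^{k+1} = prox_{gamma f}(z^k)
        y = prox_l1(2 * x - z, gamma * lam)   # y^{k+1} = prox_{gamma g}(2 x^{k+1} - z^k)
        z = z + y - x                         # z^{k+1} = z^k + y^{k+1} - x^{k+1}
    return x

In the online setting studied in the paper, the loss component changes at every round, so the first proximal step would be taken with respect to the newly revealed loss; the sketch above keeps the loss fixed, which is the batch case.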


Similar resources

On the Douglas-Rachford splitting method and the proximal point algorithm for maximal monotone operators

This paper shows, by means of a new type of operator called a splitting operator, that the Douglas-Rachford splitting method for finding a zero of the sum of two monotone operators is a special case of the proximal point algorithm. Therefore, applications of Douglas-Rachford splitting, such as the alternating direction method of multipliers for convex programming decomposition, are also special...
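
To make the statement concrete, one standard way to write the DRs recursion for finding a zero of A + B uses resolvents J_{\gamma A} = (I + \gamma A)^{-1}; the notation here is ours, not taken from the cited paper:
\[
z^{k+1} = J_{\gamma A}\bigl(2\,J_{\gamma B}(z^{k}) - z^{k}\bigr) + z^{k} - J_{\gamma B}(z^{k}),
\qquad x^{k} = J_{\gamma B}(z^{k}).
\]
The cited result shows that this map is itself the resolvent of a maximal monotone "splitting operator", so each DRs step is exactly one proximal point iteration applied to that operator.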


A New Use of Douglas-Rachford Splitting and ADMM for Identifying Infeasible, Unbounded, and Pathological Conic Programs

In this paper, we present a method for identifying infeasible, unbounded, and pathological conic programs based on Douglas-Rachford splitting, or equivalently ADMM. When an optimization program is infeasible, unbounded, or pathological, the iterates of Douglas-Rachford splitting diverge. Somewhat surprisingly, such divergent iterates still provide useful information, which our method uses for i...


On convergence rate of the Douglas-Rachford operator splitting method

This note provides a simple proof of an O(1/k) convergence rate for the Douglas-Rachford operator splitting method, where k denotes the iteration counter.


Douglas-Rachford Splitting for Cardinality Constrained Quadratic Programming

In this report, we study the class of Cardinality Constrained Quadratic Programs (CCQP), problems with (not necessarily convex) quadratic objective and cardinality constraints. Many practical problems of importance can be formulated as CCQPs. Examples include sparse principal component analysis [1], [2], cardinality constrained mean-variance portfolio selection problem [3]–[5], subset selection...
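
As a rough illustration of this problem class (the exact formulation in the cited report may differ), a CCQP can be written as
\[
\min_{x \in \mathbb{R}^{n}} \ \tfrac{1}{2}\, x^{\top} Q x + q^{\top} x
\quad \text{subject to} \quad \|x\|_{0} \le k, \ \ x \in \mathcal{C},
\]
where \|x\|_{0} counts the nonzero entries of x, Q is symmetric but not necessarily positive semidefinite, and \mathcal{C} collects any additional constraints (for example, a budget constraint in portfolio selection).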


Metric Selection in Douglas-Rachford Splitting and ADMM

Recently, several convergence rate results for Douglas-Rachford splitting and the alternating direction method of multipliers (ADMM) have been presented in the literature. In this paper, we show linear convergence of Douglas-Rachford splitting and ADMM under certain assumptions. We also show that the provided bounds on the linear convergence rates generalize and/or improve on similar bounds in ...



Journal:
  • CoRR

Volume: abs/1308.4757  Issue: -

Pages: -

Publication date: 2013